# Biomedical Text Processing
## Modernbert Base Tr Uncased

License: MIT · Author: artiwise-ai · Task: Large Language Model · Library: Transformers · Language: Other · Downloads: 159 · Likes: 9

A Turkish pretrained model based on the ModernBERT architecture, supporting an 8,192-token context length with strong performance across multiple domains.

## Extractive Summarization

License: MIT · Author: nyamuda · Task: Text Generation · Language: English · Downloads: 94 · Likes: 0

A fine-tuned version of t5-small, optimized for summarizing scientific and medical texts using the PubMed dataset.

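A minimal sketch of calling a summarizer like this through the Hugging Face `transformers` pipeline API; the repository name is inferred from the listing (author `nyamuda`) and should be verified against the hub.

```python
# Minimal sketch: summarizing a PubMed-style abstract with the transformers
# summarization pipeline. The repository name below is an assumption inferred
# from the listing and may not match the actual hub ID.
from transformers import pipeline

summarizer = pipeline("summarization", model="nyamuda/extractive_summarization")

abstract = (
    "Background: Hypertension is a leading risk factor for cardiovascular "
    "disease. We conducted a randomized trial comparing two dosing regimens..."
)

result = summarizer(abstract, max_length=128, min_length=30, do_sample=False)
print(result[0]["summary_text"])
```
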
## Biomednlp BiomedBERT Large Uncased Abstract

License: MIT · Author: microsoft · Task: Large Language Model · Library: Transformers · Language: English · Downloads: 637 · Likes: 18

BiomedBERT is a large-scale language model pretrained from scratch on PubMed abstracts, specifically designed to enhance performance on biomedical natural language processing tasks.

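A minimal sketch of probing a masked language model like this with the `fill-mask` pipeline; the checkpoint ID below is assumed from the listing and should be verified on the hub.

```python
# Minimal sketch: querying a biomedical masked LM with the fill-mask pipeline.
# The checkpoint ID is an assumption inferred from the listing.
from transformers import pipeline

fill_mask = pipeline(
    "fill-mask",
    model="microsoft/BiomedNLP-BiomedBERT-large-uncased-abstract",  # assumed ID
)

# Print the top predicted tokens for the masked position with their scores.
for candidate in fill_mask("The patient was treated with [MASK] for hypertension."):
    print(f'{candidate["token_str"]:>15}  {candidate["score"]:.3f}')
```
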
## Med KEBERT

License: OpenRAIL · Author: xmcmic · Task: Large Language Model · Library: Transformers · Language: English · Downloads: 769 · Likes: 1

A BERT-based pretrained language model for the biomedical domain, suitable for processing biomedical text data.

## Roberta Es Clinical Trials Ner

Author: lcampillos · Task: Sequence Labeling · Library: Transformers · Language: Spanish · Downloads: 277 · Likes: 10

A RoBERTa-based named entity recognition model for Spanish clinical trial texts, capable of detecting four medical semantic groups.

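A minimal sketch of running a clinical NER model like this through the token-classification pipeline; the repository name is inferred from the listing (author `lcampillos`), and the exact label set for the four semantic groups should be taken from the model card.

```python
# Minimal sketch: clinical NER on Spanish text with a token-classification
# pipeline. The repository name is an assumption inferred from the listing.
from transformers import pipeline

ner = pipeline(
    "token-classification",
    model="lcampillos/roberta-es-clinical-trials-ner",  # assumed repo name
    aggregation_strategy="simple",  # merge subword pieces into entity spans
)

text = "Los pacientes recibieron 50 mg de atenolol dos veces al día."
for entity in ner(text):
    print(entity["entity_group"], entity["word"], round(float(entity["score"]), 3))
```
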
## Bio Bert Ft

Author: ericntay · Task: Large Language Model · Library: Transformers · Downloads: 15 · Likes: 0

A model fine-tuned from BioBERT for the biomedical domain, reaching an F1 score of 0.8621 on its downstream evaluation.

## Stanford Deidentifier With Radiology Reports And I2b2

License: MIT · Author: StanfordAIMI · Task: Sequence Labeling · Library: Transformers · Language: English · Downloads: 126 · Likes: 6

A transformer-based automated de-identification system for radiology reports that protects privacy by detecting protected health information (PHI) and replacing it with realistic surrogate values.

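The sketch below shows one way such a de-identifier can be applied: detect PHI spans with a token-classification pipeline and mask them in the text. The repository name is inferred from the listing, and the simple label-placeholder masking here stands in for the surrogate-value replacement the published system performs.

```python
# Minimal sketch: PHI detection plus naive masking. The checkpoint ID is an
# assumption inferred from the listing; the report text is synthetic.
from transformers import pipeline

deid = pipeline(
    "token-classification",
    model="StanfordAIMI/stanford-deidentifier-with-radiology-reports-and-i2b2",  # assumed ID
    aggregation_strategy="simple",
)

report = "Chest CT for John Smith performed on 03/14/2021 at Stanford Hospital."

# Replace detected PHI spans right to left so character offsets stay valid
# while the string is being edited.
redacted = report
for ent in sorted(deid(report), key=lambda e: e["start"], reverse=True):
    redacted = redacted[: ent["start"]] + f'[{ent["entity_group"]}]' + redacted[ent["end"]:]

print(redacted)
```
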
## Stanford Deidentifier Only I2b2

License: MIT · Author: StanfordAIMI · Task: Sequence Labeling · Library: Transformers · Language: English · Downloads: 98 · Likes: 5

A transformer-based automated de-identification system for radiology reports that combines transformer predictions with rule-based methods for high-precision PHI detection and replacement.

## Mt5 Chinese Small

Author: yihsuan · Task: Text Generation · Library: Transformers · Language: Chinese · Downloads: 36 · Likes: 7

A summary generation model fine-tuned from mT5-small for Chinese text summarization tasks.

## Biolinkbert Large

License: Apache-2.0 · Author: michiyasunaga · Task: Large Language Model · Library: Transformers · Language: English · Downloads: 3,152 · Likes: 35

BioLinkBERT is a biomedical language model pretrained on PubMed abstracts and citation links, enhancing performance through cross-document knowledge integration.

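As a sketch of how an encoder like this is typically reused downstream, the snippet below loads the checkpoint with a fresh sequence-classification head ready for fine-tuning; the checkpoint ID is assumed from the listing, and the two-label task is purely illustrative.

```python
# Minimal sketch: attaching a new classification head to BioLinkBERT for a
# downstream biomedical task. The checkpoint ID is assumed from the listing;
# the head is randomly initialized and must be fine-tuned before use.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_id = "michiyasunaga/BioLinkBERT-large"  # assumed checkpoint ID
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id, num_labels=2)

batch = tokenizer(
    ["Aspirin irreversibly inhibits cyclooxygenase-1."],
    return_tensors="pt",
    truncation=True,
)
with torch.no_grad():
    logits = model(**batch).logits  # shape (1, 2); meaningless until fine-tuned
print(logits)
```
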
## Covidbert Nli

Author: gsarti · Task: Text Embedding · Downloads: 26 · Likes: 0

A BERT model trained on the CORD-19 corpus of coronavirus research papers and fine-tuned on natural language inference to produce general-purpose sentence embeddings.

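A minimal sketch of deriving sentence embeddings from such an encoder by mean-pooling token vectors; the repository name is taken from the listing, and the pooling choice is an assumption (models of this kind are also commonly used through the sentence-transformers library).

```python
# Minimal sketch: sentence embeddings via mean pooling over token vectors.
# The repository name is an assumption inferred from the listing.
import torch
from transformers import AutoModel, AutoTokenizer

model_id = "gsarti/covidbert-nli"  # assumed repo name
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModel.from_pretrained(model_id)

sentences = [
    "SARS-CoV-2 primarily spreads through respiratory droplets.",
    "The coronavirus is transmitted mainly via droplets in the air.",
]

batch = tokenizer(sentences, padding=True, truncation=True, return_tensors="pt")
with torch.no_grad():
    token_embeddings = model(**batch).last_hidden_state  # (batch, seq, hidden)

# Mean-pool only over real tokens, excluding padding positions.
mask = batch["attention_mask"].unsqueeze(-1).float()
embeddings = (token_embeddings * mask).sum(dim=1) / mask.sum(dim=1)

similarity = torch.nn.functional.cosine_similarity(embeddings[0], embeddings[1], dim=0)
print(f"cosine similarity: {similarity.item():.3f}")
```
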
## Biobertpt Clin

Author: pucpr · Task: Large Language Model · Language: Other · Downloads: 109 · Likes: 11

BioBERTpt is a Portuguese clinical and biomedical model based on the BERT architecture, optimized for clinical named entity recognition tasks.

## Biobertpt All

Author: pucpr · Task: Large Language Model · Language: Other · Downloads: 1,460 · Likes: 23

A BERT-based Portuguese model trained on clinical records and biomedical literature.

## Pubmedbert Abstract Cord19

License: MIT · Author: pritamdeka · Task: Large Language Model · Library: Transformers · Downloads: 16 · Likes: 0

A biomedical text processing model based on PubMedBERT and fine-tuned on the CORD-19 abstract dataset.

## Rubiobert

Author: alexyalunin · Task: Large Language Model · Library: Transformers · Downloads: 686 · Likes: 1

RuBioRoBERTa is a RoBERTa model pretrained on Russian biomedical text for natural language processing tasks in the biomedical domain.